Dimensionality reduction and volume minimization—generalization of the determinant minimization criterion for reduced rank regression problems

Authors

Abstract


Similar Articles

Dimensionality Reduction and Volume Minimization – Generalization of the Determinant Minimization Criterion for Reduced Rank Regression Problems

In this article we propose a generalization of the determinant minimization criterion. Minimizing the determinant of a matrix expression implicitly assumes that the objective matrix is nonsingular; if the objective matrix is singular, the determinant is zero and the minimization problem becomes meaningless. To be able to handle all possible cases we generalize...
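The degeneracy described above can be seen numerically. The sketch below (not the authors' algorithm) evaluates the classical reduced rank regression criterion det((Y - XB)^T (Y - XB)) against a volume-style criterion, taken here as the product of the nonzero singular values of the residual Y - XB, which is one common reading of "volume" in the title; the toy model and the rank-r fits are illustrative assumptions.

```python
import numpy as np

def det_criterion(Y, X, B):
    """Determinant of the residual cross-product matrix (Y - XB)^T (Y - XB)."""
    R = Y - X @ B
    return np.linalg.det(R.T @ R)

def volume_criterion(Y, X, B, tol=1e-10):
    """Product of the nonzero singular values of the residual Y - XB.

    One common notion of matrix "volume"; used here as a stand-in for a
    generalized criterion when the residual cross-product is singular.
    """
    s = np.linalg.svd(Y - X @ B, compute_uv=False)
    s = s[s > tol]
    return float(np.prod(s)) if s.size else 0.0

rng = np.random.default_rng(0)
n, p, m = 20, 3, 4                       # samples, predictors, responses
X = rng.standard_normal((n, p))
C = rng.standard_normal((p, m))
Y = X @ C                                # noise-free, so residuals become rank deficient

for r in (1, 2, 3):                      # candidate ranks for B (illustrative fits)
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    B = np.linalg.pinv(X) @ (U[:, :r] * s[:r]) @ Vt[:r]
    # det is (numerically) zero for every candidate rank, so it cannot
    # discriminate between them, while the volume still decreases with r.
    print(r, det_criterion(Y, X, B), volume_criterion(Y, X, B))
```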


Dimensionality Reduction Based on ICA for Regression Problems

When manipulating data, such as in supervised learning, we often extract new features from the original input variables in order to reduce the dimension of the input space and achieve better performance. In this paper, we show how standard algorithms for independent component analysis (ICA) can be extended to extract attributes for regression problems. The advantage is that general ICA al...
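As a rough illustration of the generic pipeline this abstract refers to (ICA-derived features fed into a regressor), and not the paper's specific output-aware extension, here is a minimal scikit-learn sketch; the toy data, number of components, and ridge regressor are assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 30))                          # toy high-dimensional inputs
y = X[:, :3] @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(200)

# Extract independent components as new features, then regress on them.
model = make_pipeline(FastICA(n_components=5, random_state=0), Ridge(alpha=1.0))
print(cross_val_score(model, X, y, cv=5).mean())
```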


Spectral Regression for Dimensionality Reduction

Spectral methods have recently emerged as a powerful tool for dimensionality reduction and manifold learning. These methods use information contained in the eigenvectors of a data affinity (i.e., item-item similarity) matrix to reveal low dimensional structure in high dimensional data. The most popular manifold learning algorithms include Locally Linear Embedding, Isomap, and Laplacian Eigenmap...
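A minimal sketch of the kind of spectral embedding this abstract describes: build a Gaussian affinity matrix, form the normalized graph Laplacian, and use its trailing eigenvectors as low-dimensional coordinates. This is the generic Laplacian-eigenmap step, not the Spectral Regression algorithm itself; the helper name, kernel width, and embedding dimension are assumptions.

```python
import numpy as np

def spectral_embedding(X, dim=2, sigma=1.0):
    """Embed rows of X via eigenvectors of the normalized graph Laplacian."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)    # pairwise squared distances
    W = np.exp(-sq / (2 * sigma ** 2))                      # Gaussian affinity matrix
    d = W.sum(axis=1)
    L = np.eye(len(X)) - W / np.sqrt(np.outer(d, d))        # I - D^{-1/2} W D^{-1/2}
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:dim + 1]                               # skip the trivial eigenvector

X = np.random.default_rng(0).standard_normal((100, 10))
Z = spectral_embedding(X, dim=2)
print(Z.shape)   # (100, 2)
```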


Nonlinear Dimensionality Reduction for Regression

The task of dimensionality reduction for regression (DRR) is to find a low-dimensional representation z ∈ R^q of the input covariates x ∈ R^p, with q ≪ p, for regressing the output y ∈ R^d. DRR can be beneficial for visualization of high-dimensional data and for efficient regressor design with a reduced input dimension, but also when eliminating noise in the data x through uncovering the essential information z f...
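As a concrete instance of that setup (p-dimensional inputs mapped to q-dimensional features with q ≪ p, then y regressed on those features), here is a sketch that uses kernel PCA as a stand-in for the nonlinear reduction step; the paper's own DRR method is not reproduced here, and the data, kernel, and choice of q are assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
p, q = 50, 3                                     # input dimension p, reduced dimension q << p
X = rng.standard_normal((300, p))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(300)

# Nonlinear reduction x -> z (here via kernel PCA), then regression of y on z.
drr = make_pipeline(KernelPCA(n_components=q, kernel="rbf"), LinearRegression())
drr.fit(X, y)
print(drr.score(X, y))
```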


A Conditional Entropy Minimization Criterion for Dimensionality Reduction and Multiple Kernel Learning

Reducing the dimensionality of high-dimensional data without losing its essential information is an important task in information processing. When class labels of training data are available, Fisher discriminant analysis (FDA) has been widely used. However, the optimality of FDA is guaranteed only in a very restricted ideal circumstance, and it is often observed that FDA does not provide a good...
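For reference, the FDA baseline this abstract contrasts against can be run in a few lines, using linear discriminant analysis as a supervised dimensionality reducer. This is the standard method, not the conditional entropy criterion the paper proposes; the dataset and target dimension are assumptions.

```python
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)
# FDA/LDA projects onto at most (n_classes - 1) discriminant directions.
Z = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)
print(Z.shape)   # (150, 2)
```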



Journal

Journal title: Linear Algebra and its Applications

Year: 2006

ISSN: 0024-3795

DOI: 10.1016/j.laa.2006.01.032